
    What you need to know about: delirium in older adults in hospital

    Delirium is a clinical syndrome characterised by a disturbance of perception, consciousness and/or cognitive function, with an acute onset, a fluctuating course and severe deterioration arising over hours or days. Delirium is usually triggered by a combination of influences including acute illness, surgery, drugs and environmental factors. It is commonly seen in older people presenting to hospital, but can also develop during hospitalisation. There are three types of delirium: hypoactive, hyperactive and mixed. All patients over 65 years old presenting to hospital should be screened for delirium using the ‘4AT’ tool. An alternative method for diagnosing hospital-acquired delirium is described. This article outlines a 10-stage method for diagnosing, managing and preventing delirium, with emphasis on which areas of the history and examination should be prioritised, what the salient investigations are, and both non-pharmacological and pharmacological approaches to preventing and treating delirium. Finally, this article explores which patients require specialist referrals or investigations and how best to follow up patients with delirium.
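
    The 4AT mentioned above is a short four-item screening score. The following is a minimal sketch of its scoring logic, with item weights and interpretation bands stated from memory of the published tool; both should be verified against the official 4AT form before any real use.

        # Minimal sketch of 4AT scoring logic (assumed item weights and
        # thresholds; verify against the official 4AT form before use).

        def score_4at(alertness_abnormal: bool,
                      amt4_errors: int,
                      months_backwards: int,
                      months_untestable: bool,
                      acute_change: bool) -> tuple[int, str]:
            """Return (total score, interpretation band)."""
            total = 0
            # Item 1: clearly abnormal alertness scores 4, otherwise 0.
            total += 4 if alertness_abnormal else 0
            # Item 2: AMT4 (age, date of birth, place, current year).
            total += 0 if amt4_errors == 0 else (1 if amt4_errors == 1 else 2)
            # Item 3: months of the year backwards.
            if months_untestable:
                total += 2
            elif months_backwards < 7:
                total += 1
            # Item 4: acute change or fluctuating course scores 4.
            total += 4 if acute_change else 0

            if total >= 4:
                band = "possible delirium +/- cognitive impairment"
            elif total >= 1:
                band = "possible cognitive impairment"
            else:
                band = "delirium or severe cognitive impairment unlikely"
            return total, band

        print(score_4at(False, 1, 5, False, True))  # (6, 'possible delirium ...')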

    Early diagnosis and treatment of HIV infection: magnitude of benefit on short-term mortality is greatest in older adults.

    BACKGROUND: the number and proportion of adults diagnosed with HIV infection aged 50 years and older has risen. This study compares the effect of CD4 counts and anti-retroviral therapy (ART) on mortality rates among adults diagnosed aged ≥50 years with those diagnosed at a younger age. METHODS: retrospective cohort analysis of national surveillance reports of HIV-diagnosed adults (15 years and older) in England, Wales and Northern Ireland. The relative impacts of age, CD4 count at diagnosis and ART on mortality were determined in Cox proportional hazards models. RESULTS: among 63,805 adults diagnosed with HIV between 2000 and 2009, 9% (5,683) were aged ≥50 years; older persons were more likely to be white, heterosexual and to present with a CD4 count <200 cells/mm³ (48 versus 32%, P < 0.01) and AIDS at diagnosis (19 versus 9%, P < 0.01). One-year mortality was higher in older adults (10 versus 3%, P < 0.01), especially in those diagnosed with a CD4 count <200 cells/mm³ who were left untreated (46 versus 15%, P < 0.01). While the relative mortality risk reduction from ART initiation at CD4 <200 cells/mm³ was similar in both age groups, the absolute risk difference was higher among older adults (40 versus 12% fewer deaths), such that the number needed to treat to prevent one death was two among older adults compared with eight among younger adults. CONCLUSIONS: the magnitude of benefit from ART is greater in older adults than in younger adults. Older persons should be considered a target for HIV testing. Coupled with prompt treatment, earlier diagnosis is likely to substantially reduce deaths in this group.
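
    The number-needed-to-treat (NNT) figures above follow from simple arithmetic, NNT = 1 / absolute risk reduction. A minimal worked example, taking the reported 40% and 12% absolute differences in one-year mortality at face value (exact rounding conventions may differ from the authors'):

        # NNT = 1 / absolute risk reduction (ARR), using the absolute
        # mortality differences reported in the abstract above.

        def nnt(absolute_risk_reduction: float) -> float:
            return 1.0 / absolute_risk_reduction

        print(nnt(0.40))  # older adults:   1 / 0.40 = 2.5  -> reported NNT of 2
        print(nnt(0.12))  # younger adults: 1 / 0.12 ~ 8.3  -> reported NNT of 8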

    Associations between factors across life and balance ability in mid and later life: evidence from a British birth cohort study

    Introduction: Despite its associations with falls, disability, and mortality, balance is an under-recognized and frequently overlooked aspect of aging. Studies investigating associations between factors across life and balance are limited. Understanding the factors related to balance performance could help identify protective factors and appropriate interventions across the life course. This study aimed to: (i) identify socioeconomic, anthropometric, behavioral, health, and cognitive factors that are associated with one-legged balance performance; and (ii) explore how these associations change with age. Methods: Data came from 3,111 members of the MRC National Survey of Health and Development, a British birth cohort study. Multilevel models examined how one-legged standing balance times (assessed at ages 53, 60–64, and 69) were associated with 15 factors across life: sex, maternal education (measured at age 4), paternal occupation (age 4), own education (age 26), own occupation (age 53), and contemporaneous measures (at ages 53, 60–64, and 69) of height, BMI, physical activity, smoking, diabetes, respiratory symptoms, cardiovascular events, knee pain, depression, and verbal memory. Age and sex interactions with each variable were assessed. Results: Men had 18.8% (95% CI: 13.6, 23.9) longer balance times than women at age 53, although this difference decreased with age (11.8% at age 60–64 and 7.6% at age 69). Disadvantaged socioeconomic position in childhood and adulthood, low educational attainment, less healthy behaviors, poor health status, lower cognition, higher body mass index (BMI), and shorter height were associated with poorer balance at all three ages. For example, at age 53, those from the lowest paternal occupational classes had 29.6% (95% CI: 22.2, 38.8) worse balance than those from the highest classes. Associations of balance with socioeconomic indicators, cognition and physical activity became smaller with age, while associations with knee pain and depression became larger. There were no sex differences in these associations. In a combined model, the majority of factors remained associated with balance. Discussion: This study identified numerous risk factors across life that are associated with one-legged balance performance and highlighted diverse patterns of association with age, suggesting that there are opportunities to intervene in early, mid and later life. A multifactorial approach to intervention, at both societal and individual levels, may have more benefit than focusing on a single risk factor.
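
    A minimal sketch of the kind of multilevel model described above: repeated log balance times nested within participants, with age-by-covariate interactions to test whether associations change between ages 53 and 69. The file and column names are hypothetical stand-ins, and the NSHD analysis (15 exposures) was not necessarily fit with this software.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        df = pd.read_csv("nshd_balance_long.csv")    # hypothetical long-format file
        df["log_balance"] = np.log(df["balance_s"])  # % differences ~ log scale
        df["age_c"] = df["age"] - 53                 # centre age at first assessment

        # Random intercept per participant; the age interactions test whether
        # each association strengthens or weakens across the three assessments.
        model = smf.mixedlm(
            "log_balance ~ age_c * (sex + bmi + height + knee_pain + depression)",
            data=df,
            groups=df["person_id"],
        )
        result = model.fit()
        print(result.summary())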

    The effects of computerised cognitive training on post-CABG delirium and cognitive change: A prospective randomised controlled trial

    Background: Cognitive impairments, including delirium, are common after coronary artery bypass grafting (CABG). Improving cognition pre- and post-operatively using computerised cognitive training (CCT) may be an effective approach to improve cognitive outcomes in CABG patients. Objectives: To investigate the effect of remotely supervised CCT on cognitive outcomes, including delirium, in older adults undergoing CABG surgery. Methods: Thirty-six participants were analysed in a single-blinded randomised controlled trial (CCT intervention: n = 18, control: n = 18). CCT was completed by the intervention group pre-operatively (45–60-minute sessions every other day until surgery) and post-operatively, beginning 1 month post-CABG (3 × 45–60-minute sessions/week for 12 weeks), while the control group maintained usual care plus weekly phone calls. Cognitive assessments were conducted pre- and post-operatively at multiple follow-ups (discharge, 4 months and 6 months). Post-operative delirium incidence was assessed daily until discharge. Cognitive change data were calculated at each follow-up for each cognitive test (Addenbrooke’s Cognitive Examination III and CANTAB; z-scored). Results: Adherence to the CCT intervention (completion of three pre-operative sessions or 66% of post-operative sessions) was achieved by 68% of pre-CABG and 59% of post-CABG participants. There were no statistically significant effects of CCT on any cognitive outcome, including delirium incidence. Conclusion: Adherence to the CCT program was higher than in previous feasibility studies, possibly due to the level of supervision and support provided (a blend of face-to-face and home-based training, with support phone calls). Implementing CCT interventions both pre- and post-operatively is feasible in those undergoing CABG. No statistically significant benefits of the CCT interventions were identified for delirium or cognitive function post-CABG, likely due to the available sample size (study recruitment was greatly impacted by COVID-19). It may also be that a multimodal intervention would be more effective.
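
    A minimal sketch of z-scored cognitive change, one common way to put ACE-III and CANTAB measures on a shared scale. Standardising change against the baseline standard deviation is an assumption here; the trial may have used a different reference (for example, the control group at each follow-up), and the file and column names are hypothetical.

        import pandas as pd

        def z_change(baseline: pd.Series, follow_up: pd.Series) -> pd.Series:
            """Change from baseline, in baseline standard-deviation units."""
            return (follow_up - baseline) / baseline.std(ddof=1)

        df = pd.read_csv("cabg_cognition.csv")  # hypothetical wide-format file
        for test in ["ace_iii", "cantab_pal"]:  # hypothetical column stems
            for visit in ["discharge", "4m", "6m"]:
                df[f"{test}_zchange_{visit}"] = z_change(
                    df[f"{test}_baseline"], df[f"{test}_{visit}"]
                )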

    Straight and Divergent Pathways to Cognitive State: Seven Decades of Follow-Up in the British 1946 Birth Cohort

    BACKGROUND: Using the British 1946 birth cohort, we previously estimated life course paths to the Addenbrooke's Cognitive Examination (ACE-III). OBJECTIVE: We now compare those whose ACE-III scores were as expected, worse than predicted and better than predicted by the path model on a range of independent variables, including clinical ratings of cognitive impairment and neuroimaging measures. METHODS: Residuals from the predicted ACE-III scores were categorized into three groups: Expected (between −1.5 and 1.5 standard deviations; SD), Worse (below −1.5 SD) and Better (above 1.5 SD). Differences in the independent variables were then tested between these three groups. RESULTS: Compared with the Expected group, those in the Worse group showed independent evidence of progressive cognitive impairment: faster memory decline, more self-reported memory difficulties, more functional difficulties, greater likelihood of being independently rated by experienced specialist clinicians as having a progressive cognitive impairment, and a cortical thinning pattern suggestive of preclinical Alzheimer's disease. Those in the Better group showed slower verbal memory decline and an absence of independently rated progressive cognitive impairment compared with the Expected group, but no differences in any of the other independent variables, including the neuroimaging variables. CONCLUSION: The residual approach shows that life course features can map directly to clinical diagnoses. One future challenge is to translate this into a readily usable algorithm to identify high-risk individuals in the preclinical state, when preventive strategies and therapeutic interventions may be most effective.
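
    A minimal sketch of the residual approach described above: compare observed ACE-III scores with path-model predictions and band the standardised residuals at ±1.5 SD. Variable names are illustrative; the actual predictions come from the cohort's life course path model.

        import numpy as np
        import pandas as pd

        def residual_groups(observed: pd.Series, predicted: pd.Series) -> pd.Series:
            """Band standardised observed-minus-predicted residuals at +/-1.5 SD."""
            resid = observed - predicted
            z = (resid - resid.mean()) / resid.std(ddof=1)
            return pd.cut(
                z,
                bins=[-np.inf, -1.5, 1.5, np.inf],
                labels=["Worse", "Expected", "Better"],
            )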

    Protocol for the Delirium and Cognitive Impact in Dementia (DECIDE) study: A nested prospective longitudinal cohort study.

    BACKGROUND: Delirium is common, affecting at least 20% of older hospital inpatients. It is widely accepted that delirium is associated with dementia, but the degree of causation within this relationship is unclear. Previous studies have been limited by incomplete ascertainment of baseline cognition or a lack of prospective delirium assessments. There is an urgent need for an improved understanding of the relationship between delirium and dementia, given that delirium prevention may plausibly impact upon dementia prevention. A well-designed observational study could also answer fundamental questions of major importance to patients and their families regarding outcomes after delirium. The Delirium and Cognitive Impact in Dementia (DECIDE) study aims to explore the association between delirium and cognitive function over time in older participants. In an existing population-based cohort aged 65 years and older, the effect of an episode of delirium on cognition will be measured, independent of baseline cognition and illness severity. The predictive value of clinical parameters including delirium severity, baseline cognition and delirium subtype on cognitive outcomes following an episode of delirium will also be explored. METHODS: Over a 12-month period, surviving participants from the Cognitive Function and Ageing Study II-Newcastle will be screened for delirium on admission to hospital. At the point of presentation, baseline characteristics along with a number of disease-relevant clinical parameters will be recorded. The progression/resolution of delirium will be monitored. In those with and without delirium, cognitive decline and dementia will be assessed at one-year follow-up. We will evaluate the effect of delirium on cognitive function over time, along with the predictive value of clinical parameters. DISCUSSION: This study will be the first to prospectively elucidate the size of the effect of delirium upon cognitive decline and incident dementia. The results will be used to inform future dementia prevention trials that focus on delirium intervention.

    Establishing the precise evolutionary history of a gene improves prediction of disease-causing missense mutations

    PURPOSE: Predicting the phenotypic effects of mutations has become an important application in clinical genetic diagnostics. Computational tools evaluate the behavior of the variant over evolutionary time and assume that variations seen during the course of evolution are probably benign in humans. However, current tools do not take into account orthologous/paralogous relationships. Paralogs have dramatically different roles in Mendelian diseases. For example, whereas inactivating mutations in the NPC1 gene cause the neurodegenerative disorder Niemann-Pick C, inactivating mutations in its paralog NPC1L1 are not disease-causing and, moreover, are implicated in protection from coronary heart disease. METHODS: We identified major events in NPC1 evolution and revealed and compared orthologs and paralogs of the human NPC1 gene through phylogenetic and protein sequence analyses. We predicted whether an amino acid substitution affects protein function by reducing the organism's fitness. RESULTS: Removing the paralogs and distant homologs improved the overall performance of categorizing disease-causing and benign amino acid substitutions. CONCLUSION: The results show that a thorough evolutionary analysis followed by identification of orthologs improves the accuracy of predicting disease-causing missense mutations. We anticipate that this approach will be used as a reference in the interpretation of variants in other genetic diseases as well. Genet Med 18(10), 1029–1036.
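
    A minimal sketch of the core idea in this paper: restrict an alignment to true orthologs before asking whether a substituted residue has ever been tolerated at a position. The toy alignment and 'is_ortholog' labels are hypothetical stand-ins for the phylogenetic analysis described above.

        from collections import Counter

        aligned = {  # gapped, equal-length sequences (toy example)
            "human_NPC1":   "MKTAYIA",
            "mouse_NPC1":   "MKTAYIA",
            "chick_NPC1":   "MKSAYIA",
            "human_NPC1L1": "MQTGYLA",  # paralog under different constraints
        }
        is_ortholog = {name: "NPC1L1" not in name for name in aligned}

        def column_frequencies(seqs):
            """Per-position residue frequencies across the given sequences."""
            return [Counter(col) for col in zip(*seqs)]

        orthologs_only = [s for n, s in aligned.items() if is_ortholog[n]]
        freqs = column_frequencies(orthologs_only)

        # A substitution never observed at that position among orthologs is
        # a candidate damaging variant; one seen in orthologs is likely benign.
        def seen_in_orthologs(pos: int, residue: str) -> bool:
            return freqs[pos][residue] > 0

        print(seen_in_orthologs(1, "Q"))  # False: Q at position 1 occurs only in the paralog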